From The Free On-line Dictionary of Computing (05 January 2017) [foldoc]:
fuzzy logic
fuzzy computing
A superset of Boolean logic dealing with the concept of
partial truth -- truth values between "completely true" and
"completely false". It was introduced by Dr. Lotfi Zadeh of
UCB in the 1960s as a means to model the uncertainty of
natural language.
Any specific theory may be generalised from a discrete (or
"crisp") form to a continuous (fuzzy) form, e.g. "fuzzy
calculus", "fuzzy differential equations" etc. Fuzzy logic
replaces Boolean truth values with degrees of truth which are
very similar to probabilities except that they need not sum to
one. Instead of an assertion pred(X), meaning that X
definitely has the property associated with predicate
"pred", we have a truth function truth(pred(X)) which gives
the degree of truth that X has that property. We can combine
such values using the standard definitions of fuzzy logic:
truth(not x) = 1.0 - truth(x)
truth(x and y) = minimum (truth(x), truth(y))
truth(x or y) = maximum (truth(x), truth(y))
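A minimal sketch of these standard combination rules in Python; the
function names (truth_not, truth_and, truth_or) and the example degrees
of truth are illustrative, not part of the dictionary entry:

    def truth_not(x):
        # Degree of truth of "not x": 1.0 - truth(x)
        return 1.0 - x

    def truth_and(x, y):
        # Standard fuzzy "and": the minimum of the two degrees of truth
        return min(x, y)

    def truth_or(x, y):
        # Standard fuzzy "or": the maximum of the two degrees of truth
        return max(x, y)

    # Example: combining two partial truths
    warm = 0.7       # truth(warm(today))
    sunny = 0.4      # truth(sunny(today))
    print(truth_and(warm, sunny))    # 0.4
    print(truth_or(warm, sunny))     # 0.7
    print(truth_not(warm))           # approximately 0.3 (floating point)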
(There are other possible definitions for "and" and "or",
e.g. using sum and product.) If truth values are restricted
to 0 and 1, then these functions behave just like their
Boolean counterparts. This is known as the "extension principle".
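As a sketch of one common alternative pairing (product for "and" and a
bounded sum for "or"; the entry does not specify which variant it has in
mind, and the function names below are illustrative), together with a
check that restricting truth values to 0 and 1 recovers Boolean behaviour:

    def truth_and_prod(x, y):
        # Alternative "and": product of the degrees of truth
        return x * y

    def truth_or_sum(x, y):
        # Alternative "or": sum of the degrees of truth, capped at 1.0
        return min(1.0, x + y)

    # Restricted to the Boolean values 0 and 1, both the min/max and the
    # sum/product definitions agree with ordinary Boolean and/or.
    for x in (0.0, 1.0):
        for y in (0.0, 1.0):
            assert truth_and_prod(x, y) == min(x, y)
            assert truth_or_sum(x, y) == max(x, y)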
Just as a Boolean predicate asserts that its argument
definitely belongs to some subset of all objects, a fuzzy
predicate gives the degree of truth with which its argument
belongs to a fuzzy subset.
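For illustration, such a fuzzy predicate can be written as an ordinary
function returning a degree of membership; the "tall" predicate and its
thresholds below are hypothetical, not part of the dictionary entry:

    def truth_tall(height_cm):
        # Hypothetical fuzzy predicate "tall": the degree to which a
        # person of the given height belongs to the fuzzy subset of tall
        # people. Below 150 cm the degree is 0.0, above 190 cm it is 1.0,
        # and in between it rises linearly.
        if height_cm <= 150:
            return 0.0
        if height_cm >= 190:
            return 1.0
        return (height_cm - 150) / 40.0

    print(truth_tall(170))    # 0.5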
Usenet newsgroup: comp.ai.fuzzy.
E-mail servers: